feat: Add Tool.from_component #159
base: main
Conversation
Pull Request Test Coverage Report for Build 12434782559 (Coveralls)
Thanks for the effort you are putting into this PR.
I do have some questions and suggestions.
The requirement was to support components with basic str or native Python types as input. This implementation appears to go beyond that, which is great for flexibility but might become hard to maintain.
- What's the advantage of supporting Pydantic models here? Maybe I am just missing some reasonable use cases...
As I commented in the PR, if I remember correctly, one of the initial requirements was to enable the deserialization of Tools from YAML, which would be feasible if Tools are treated as components. Is that possible? If so, can we add some tests to cover it?
Given the main goal of this PR, would it make sense to involve someone from the DC team in the review?
```python
msg = (
    "Component has been added in a Pipeline and can't be used to create a Tool. "
    "Create Tool from a non-pipeline component instead."
)
```
Can you please explain this?
If I remember correctly, one of the requirements was about deserializing Tools from YAML (which should be feasible if Tools are components). I'm not totally sure...
Yes, I thought we could have a component that is declared but not part of a pipeline. Maybe not; depending on that, we can remove this check.
I still don't understand if this is a self-imposed limitation (I don't think so) or there are strong reasons to avoid that. Could you please explain this point further?
Pydantic's `TypeAdapter` handles a wide range of JSON-to-object conversions, whether the targets are strings, other native Python types, dataclasses, or Pydantic models. It would in fact take more, and harder-to-maintain, code to implement a version of `Tool.from_component` that supports only strings, native Python types, and their lists, because we would need to detect the unsupported cases, raise errors, and so on. `TypeAdapter.validate_python` handles all of these conversions. The unit and integration tests I included for dataclasses and Pydantic models show that this is possible with almost no code; we even support our own `Document` class, as shown in the test examples. Given these benefits of `TypeAdapter`, which enables proper conversion support with about 10 lines of well-tested Pydantic code, I went with that solution even though the requirements didn't call for it.
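To illustrate the point above, here is a minimal sketch (not the PR's actual code) of `TypeAdapter.validate_python` converting plain JSON-style data into typed objects; the `City` dataclass is a made-up example:

```python
from dataclasses import dataclass
from typing import List

from pydantic import TypeAdapter


@dataclass
class City:
    name: str
    population: int


# TypeAdapter works for dataclasses, native types, and Pydantic models alike.
adapter = TypeAdapter(List[City])

# A list of plain dicts (e.g. parsed from a JSON tool call) is validated
# and converted into City instances in one call.
cities = adapter.validate_python([{"name": "Berlin", "population": 3600000}])
```

This is why no custom per-type conversion code is needed: the same one-liner covers every supported input type.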
Yes, I need to review this part as well.
Yes, @mathislucka is on PTO and he had the most context. Let's wait for him.
I like the simplification you have made.
I left other comments and asked Julian to take a look as well.
```python
from .component_schema import create_tool_parameters_schema

__all__ = ["create_tool_parameters_schema"]
```
- I would not export this function here if possible; see haystack#8650 ("Importing one component of a certain family/module leads to importing all components of the same family/module").
- I would prefer to make this method internal and also all others in component_schema.py. They should not be user-facing and if we make them internal, we are then free to change them at any time if needed.
Ok makes sense, will do 🙏
Why:
Automates the conversion of Haystack components into LLM tools. Implements `Tool.from_component` for components with str/native Python types as input (haystack#8630).

What:
- Adds `component_schema.py`, which converts component run parameters into a JSON tools schema format.
- Updates `tool.py` to introduce a `from_component` method, enabling the creation of `Tool` instances from Haystack components. This makes tool creation and integration into pipelines more dynamic.
- Adds `test_tool_component.py`, validating the conversion and functionality of components as tools, including tests for various data types and nested structures.

How can it be used:
Create a `Tool` from a component and then use it as a Tool following the established patterns.
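As a rough illustration of the schema-generation step described above, here is a self-contained sketch of deriving a JSON-tools-style parameters schema from a `run()` signature. The `parameters_schema` helper, the `PY_TO_JSON` mapping, and the sample `run` function are all hypothetical, not the PR's actual `create_tool_parameters_schema` implementation:

```python
import inspect
from typing import get_type_hints

# Hypothetical mapping from Python annotations to JSON schema type names.
PY_TO_JSON = {str: "string", int: "integer", float: "number", bool: "boolean"}


def parameters_schema(fn):
    """Build a JSON-schema-like dict from a callable's typed parameters."""
    hints = get_type_hints(fn)
    sig = inspect.signature(fn)
    props, required = {}, []
    for name, param in sig.parameters.items():
        props[name] = {"type": PY_TO_JSON.get(hints.get(name), "string")}
        # Parameters without defaults become required in the schema.
        if param.default is inspect.Parameter.empty:
            required.append(name)
    return {"type": "object", "properties": props, "required": required}


# A stand-in for a component's run method.
def run(query: str, top_k: int = 5) -> dict:
    return {}


schema = parameters_schema(run)
# schema["properties"] -> {"query": {"type": "string"}, "top_k": {"type": "integer"}}
# schema["required"]   -> ["query"]
```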
How did you test it:
Notes for the reviewer: